Facebook’s Metaverse is Expanding the Attack Surface
Thirty years ago, Paramount trademarked the name “Holodeck.” An artifact of Star Trek: The Next Generation, the holodeck was a magical, computer-generated world where characters lived in another realm – either a historical place or an entirely fictitious domain based on old movies, books, or a character’s imagination. As in much science fiction, the holodeck’s inner workings were never explained, except when dealing with a malfunction: the safety protocols stopped working, an alien took over the controls, or a fictional character escaped, any of which put one or more characters’ lives at risk.
Also thirty years ago, Gartner published a research report, “Client/Server and Cooperative Processing.” It laid out the underlying model behind client/server computing and the forms simple two-tiered architectures might take. As a side effect, the report explained why client/server computing makes sense (as opposed to doing everything on one machine). Different types of computers have different ratios of computational power to available data. Historically, mainframes tend to be data-rich (tuned to run at 100% processor utilization) and MIPS-poor, while PCs tend to be MIPS-rich (rarely reaching significant processor utilization) and data-poor – by a factor of about 3,000. If the computational problem involves lots of data but relatively little processing, a mainframe-style computer fits the bill. If the problem involves lots of processing but not much data, a PC makes sense. And if the problem requires lots of data and lots of processing, then split it into two parts – put the data-heavy part on one machine and the compute-intensive part on the other.
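That placement logic can be expressed as a simple decision rule. The sketch below is purely illustrative – the function name, thresholds, and units are invented for this example, not taken from the Gartner report:

```python
def place_workload(data_gb: float, compute_mips: float,
                   data_threshold: float = 100.0,
                   compute_threshold: float = 500.0) -> str:
    """Toy heuristic: decide where a workload runs based on its
    data footprint versus its compute demand. Thresholds are
    arbitrary placeholders for illustration only."""
    data_heavy = data_gb > data_threshold
    compute_heavy = compute_mips > compute_threshold
    if data_heavy and compute_heavy:
        # Both heavy: split into two tiers, as client/server computing suggests.
        return "split: data tier on server, compute tier on client"
    if data_heavy:
        # Data-rich, compute-light: suits a mainframe-style machine.
        return "server (data-rich, e.g. mainframe)"
    if compute_heavy:
        # Compute-rich, data-light: suits a PC-style machine.
        return "client (MIPS-rich, e.g. PC)"
    return "either"

# A workload that needs lots of data AND lots of processing gets split:
print(place_workload(data_gb=500, compute_mips=2000))
```

The metaverse, like the holodeck, sits squarely in the "both heavy" branch, which is why a multi-tiered architecture is the natural fit.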
Enter the Metaverse
The holodeck is the limiting case of a computational problem requiring lots of data and lots of processing. We can be sure that it is implemented using a multi-tiered architecture. Which brings us to the metaverse, our real-world version of the holodeck. The metaverse will provide a rich, immersive experience when the user wears AR glasses and gloves with haptic feedback (local client computing for compute-intensive tasks) fronting a richly connected network of servers holding vast amounts of data about the background, landscape, avatars, and the physics of the virtual environment.
From a security perspective, the metaverse presents every possible attack surface. The primary IT components connect using IP, but the many devices needed to flesh out the illusion will run a multitude of industrial control system protocols. Cost pressures will drive vendors building the infrastructure to source low-cost IIoT components, which still lack basic security and privacy controls. Even in the holodeck, advanced authentication was easily forged. Man-in-the-middle attacks will proliferate. Privacy will be non-existent, because people react to sensory input faster than they consciously realize, and the local client hardware will pick up and remember those reactions. While people are exploring their virtual world, the virtual world is constantly monitoring and evaluating each individual’s likes, wants, and preferences. That mountain of profile data will make marketing vastly more persuasive, not just for consumer products but also for targeted political advertising. Vance Packard would be in awe of the metaverse’s power.
Security conventionally guarantees that data shall not be lost, altered, or inadvertently disclosed. Adding the industrial control system mandate for safety brings us to a new model for cybersecurity, one fitting the threats the metaverse will unleash. Since effective cybersecurity combines technology with policy and user education, we are a long way from securing the metaverse. The architecture is just now coming to light, the proper procedures are far from a first draft, and regulations are a decade behind that. For now, the strongest link remains the people using it. Be careful, and thoughtful, about what you want to share and how you would keep a secret in this new virtual world. “Arch!” doesn’t work quite yet.
References
HOLODECK Trademark 74327473, filed Oct 31, 1992.
“Client/Server and Cooperative Processing – a Guide for the Perplexed,” William Malik, Tony Percy, W. Roy Schulte, Gartner, Stamford, CT, October 1992.
The Hidden Persuaders, Vance Packard, David McKay Co., New York, 1957.
What do you think? Let me know @WilliamMalikTM